Vigilant

Original Fiction Cyberpunk

In a new Little Brother story, when schools make war on their own students, something has to give . . .

Illustrated by Will Staehle

By Cory Doctorow

Published on September 26, 2024

[Illustration: a mobile phone on an orange background, its screen cracked into the shape of a star with a red X at the center.]

Kids hate email.

Dee got my number from his older brother, who got it from Tina, my sister-in-law, who he knew from art school. He texted me just as I was starting to make progress with a gnarly bug in some logging software I was trying to get running for my cloud servers.

My phone went bloop and vibrated a little on the kitchen table, making ripples in my coffee. My mind went instantly blank. I unlocked my phone.

> Is this marcus

I almost blocked the number, but dammit, this was supposed to be a private number. I’d just changed it. I wanted to know how it was getting out and whether I needed to change it again.

> Who’s this?

Yeah, I punctuate my texts. I’m old.

> I need help with some school stuff some spying stuff at school i heard your good at that

His name was Dee and he was fifteen years old and he had clinical anxiety. He lived in Oakland, and he had a Section 504 waiver from school that let him stay home on days when things were bad. The school had cameras in every classroom, left over from the hybrid education days, so he could Zoom in when he needed to.

Dee’s anxiety was especially bad when he had to take tests. Back in middle school, he’d had cool teachers who let him do projects instead of taking high-stakes tests. That was a big nope at Oakland High.

Which presented a dilemma: if Dee’s teachers were going to evaluate his learning by testing him, and if Dee couldn’t take a test at school, how were they going to be sure he didn’t cheat?

“They use what?” Dee flinched and I realized how screechy my voice had gotten. “Sorry,” I said.

Dee looked down into his latte and I felt bad. I’d asked him where he wanted to meet and he’d suggested this coffee shop near the Fruitvale BART station, which reminded me of the kinds of places I’d loved to hang out in when I was in high school—lots of dusty sofas and a shelf where you could take or leave free books.

I’d gotten there early and soaked it all in while I waited for him, wondering if I’d recognize him. It turned out to be easy. Dee was a tall kid, but habitually ducked so low he almost bowed. He seemed scared and sad, and he kept his eyes fixed on a spot on the floor about a yard ahead of his feet. I saw the barista recognize him and smile and then shake her head, and I felt bad and good at the same time. Bad that Dee was having such a hard time, good that he had a barista who understood him and would take care of him.

“Yo, Dee,” the barista said. She was only a couple years older than him but she radiated all the self-confidence he lacked. I’d watched her do showy flips with the portafilters and jugs while making coffees and decided I liked her.

He raised his head and smiled at her. What a smile, though. Dee smiled like a saint in a Renaissance painting. It was a smile that lit up the barista, too, and she gave him a smile of her own back that echoed it.

I liked him right away, and wanted to help him.

I waved at him as he collected his coffee, but he had already spotted me and was heading my way. I guess he had good peripheral vision and the situational awareness of someone with a lot of uncontrolled anxiety.

As he got closer, I saw that his jeans had some kind of weird patina or stain or something, and then as he got closer still, I saw that it was ink doodles, intricate and interlocking, a mix of geometric shapes and abstract human faces and figures. It was hypnotic. So hypnotic, in fact, that I didn’t even notice that the doodles continued over his hands and forearms until he stooped to put his latte down on the coffee table, dropped his backpack, and settled into the armchair opposite the sofa I was trapped in.

“You must be Dee?”

He ducked his head and muttered a tiny “yes.” Ange and I had read up on anxiety before this meeting and I didn’t take it personally.

“It’s nice to meet you. Tell me what’s going on, and how I can help?”

What he told me next was so extraordinary and dystopian that it caused me to yelp, “They use what?” and startle the poor kid. I felt terrible, I apologized, but I was stunned.

Dee’s school was worried that kids who did their tests remotely would cheat, because at Dee’s school, they conceived of education as an adversarial process: the teachers wanted to cram education into the students’ brains, and the students wanted to avoid being educated. Tests were a way for teachers to know if they could stop cramming, and if the students could, they would 100 percent fake the test results so that the cramming would end.

If this strikes you as a weird and terrible way to think about education, well, yeah, it is, isn’t it?

If students are the enemy on the testing battlefield, then they can’t be trusted to take the tests on their own honor. Instead, they have to be guarded over.

But how do you police a student’s test-taking when they’re at home, using their own computer?

Simple: “Remote invigilation.”

I know. What the hell is that?

Well, “invigilation” is a fancy word for “proctoring”—that is, standing guard over a test-taker. So remote invigilation is guarding a student at a distance. And guess what?

There’s an app for that.

A very, very expensive app.

Integretron was the Oakland Unified School District’s remote invigilation vendor. They were nowhere near the top of the pack, but they were trying their darndest to catch up by offering their customers the most invasive and disgusting feature-sheet in the industry.

On the surface, Integretron worked just like its big competitors, like Proctorio and Honorlock. Students began their testing by providing a 360-degree webcam tour of the room to prove they were alone and that there weren’t any cheat sheets that the test-taker could consult.

Of course, if you shared a room with your parents or grandparents or uncles or cousins, you had to chase them out or you’d flunk automatically. That was especially hard on people who shared a room with someone who had to work the night shift. And guess what? People who have to work night shifts tend to be poor and desperate—exactly the sort of people who end up having to double up on bedrooms.

This is America, so being poor is also correlated with being brown, and that meant that the facial-recognition stuff built into Integretron had a hard time making sense out of your face. Dee’s skin was lighter than his parents’, but he was still dark enough that he needed to shine two task lights directly into his face for the system to recognize him.

Facial recognition?

Yeah, the system used facial recognition. That way it could tell if a test-taker looked to the side, or down, or up, or anywhere that wasn’t right in the middle of the screen where the test form was. Are you the kind of person who looks up or around the room while thinking? Me too. Integretron’s robots would fail us both in a heartbeat.

Do you whisper to yourself while you’re thinking? Fail. Do you have a dog or a cat or a baby in your house who might make a noise your laptop mic would pick up? Fail. Do you have a disability that requires you to urgently go to the toilet? Fail. Do you throw up when you’re anxious? Fail. Are you experiencing labor pains? Fail, fail, fail.

Even if you’re the kind of person who thinks testing is a good way to measure learning, you probably agree that there should be a human in the loop, not just a janky, racist facial recognition system.

Don’t worry, they’ve got humans. Oh boy do they have humans.

Like its competitors, Integretron pays an army of low-wage subcontractors around the Pacific Rim to spy on test-takers through their webcams. But these human guards aren’t charged with double-checking the strikes the algorithm assigns. No, they’re in charge of drumming up new offenses for test-takers, ones the algorithm might miss.

These human monitors have an incredibly boring job, and a bizarre amount of unchecked power. When you run Integretron’s test-taking software, it sinks its hooks deep into your operating system, giving the company’s subcontractors unlimited access to your computer. Sometimes, these spies get tired of just watching you work through your test, and seize control of your mouse-pointer and jiggle it, as if to say, come on, come on, let’s get going here.

If that sounds like it might distract you from taking a test, just put yourself in Dee’s seat. Imagine that you’re a Black kid from Oakland with two ultra-bright task lights shining directly into your eyes, which have to stay locked precisely on the middle of your screen, as a bored proctor an ocean away jiggles your mouse-pointer, while you’re coping with crippling anxiety.

Dee is a good student. His project work gets top marks. But his tested grades were so poor that he was in danger of flunking every class in the tenth grade. It got so bad that he did something he’d never done before: he cheated. When he froze on a social studies exam about the Spanish Inquisition, he tried to look up the answers. He’d smuggled in an old phone that he’d contrived to prop against his screen after the initial room inspection, a phone he’d paired with a miniature gamer’s Bluetooth keyboard that he’d taught himself to touch-type on. The little Chiclet keys were almost noiseless, and he literally hid the keyboard up his sleeve and surreptitiously tipped it out onto his desk, where he could type on it with his right hand without showing any suspicious shoulder movements.

Predictably, he froze on the first test question. Telling himself that he was only priming the pump, he touch-typed the question into a Google search-bar on the phone, and then used the keyboard’s miniature trackball to click the top search result, at 123testanswers.xyz.

Clicking that link felt dirty, but he clicked it, and he was strangely relieved when the link went to a page with an obscure error message and not the answer. In fact, he was so relieved that it broke through his anxiety and he aced the test. He felt triumphant as he put the final touches on a truly excellent essay question response and signed off and whooshed out a huge breath and collapsed onto his bed.

He got himself a snack and came back into his room. He reached for his phone and took it out of do-not-disturb mode and watched as it lit up in a blaze of notifications.

ACADEMIC DISHONESTY, the first one began.

CHEATING, began the second.

DISCIPLINARY PROCEEDINGS, went the third.

After that, Dee stopped reading, turned off his device, and cried.

“It was a honeypot?” I said.

He looked down. “I don’t know what that means.”

“Sorry,” I said. “Sorry. I get to talking in hacker jargon sometimes. It was a trap, I mean. The answers site was a fake. They used search engine optimization to get to the top of Google, then captured your phone’s IP address and sent it to the system. Then they checked to see if there was anyone at that address taking a test on a different device.”

He shrugged. “I guess. That’s what I figured, anyway.”

“So then what happened?”

He sighed. “A two-week suspension. Two months of academic probation. That ends this week and I—” He took a few deep breaths. “This has all been really hard for me. For my…For the anxiety. So I’m going to stay home. For a while. But they say that because I committed academic dishonesty, I need to have Integretron on all the time.”

“That’s so gross. Oh, man, is that gross.”

“It is.”

“You should talk to a lawyer. That can’t be legal. I mean, the Americans with Disabilities Act—”

“We have a lawyer. She says it will be a long time before we get anything out of them. Maybe never. And I have to go back to school on Monday.”

“Man, I’m so sorry, Dee. So, what can I do to help with this?”

He stopped looking at his decorated hands and jeans and the remainder of his latte and locked gazes with me. “I want to hack Integretron.”

I was nervous that it might be hard to get an Integretron account so I could start to probe it, but it was disturbingly easy. With remote learning numbers falling after the pandemic, Integretron had embarked upon an indiscriminate sales blitz. I created a fake charter school—The Ten Commandments Excellence Academy—and used a procedural website generator to create three years’ worth of blog posts, sales brochures and About Us pages that I back-dated in a WordPress blog I stood up on a free hosting site.

It had been a while since I’d created a fake website, and I was amazed by how much easier it had gotten: all I did was feed candidate texts from actual shitty charter schools to GPT-3, a public machine learning text generator trained on 45 terabytes of text, and it spat out word-salad that read as though it was written by a spittle-flecked manic entrepreneur hoping to make a fortune off of religious parents who didn’t want their kids to learn about butt stuff in sex ed.

I probably didn’t need to bother. For all that Integretron was a company founded on suspicion and accusation, the sales rep who called me after I requested a demo account was incredibly eager to buy my story and walk me through the features that set Integretron apart from the market leaders like Proctorio and Honorlock.

Lucky for me, one way that Integretron set itself apart was in its willingness to share its documentation. Proctorio had elevated legal bullying of its critics to an artform. They sued ’em for libel. They sued ’em for copyright infringement, arguing that linking to the YouTube videos the company itself had uploaded to boast about how creepy its products were was a violation of copyright law (they had set those videos to “unlisted” but anyone with a link could see them).

By contrast, Integretron was a paragon of transparency, and all of its docs were right there in the open, where I could get at them. The more I read, the sicker I felt. The company embodied an educational philosophy that had more in common with warfare than pedagogy. For them, students and teachers were enemies, and education was a battlefield: the teacher’s job was to cram as much knowledge as possible into the student’s head without being deceived about the student’s understanding of the subject. The student’s victory condition was to achieve a passing grade while learning as little as possible.

Dee was a motivated kid. He wanted to learn. His essay question response on information control during the Inquisitions was beautifully written and insightful. I’d always had a Monty Python–level understanding of the Inquisitions, assuming they were about banning heresy. Dee made me understand that they were about controlling heresy, explaining that the Inquisitors made a point of saving all the books they’d “banned” in private libraries, where access could be tightly regulated. The irony of this philosophy of knowledge in light of the punishment Dee was experiencing courtesy of Integretron and his school wasn’t lost on me.

Dee wanted to learn. If there was part of a subject he didn’t get, he wanted to have his teachers know about it and help him grok it. In theory, he understood that tests were a way to help that process along, but in practice, he understood that the point of a test was to figure out if he had been a “good student” and to discipline him if he hadn’t.

“The way it’s set up, either I did something wrong and didn’t learn right, or my teacher did something wrong and didn’t teach right. That’s their whole approach.” He was a smart kid. We were checking in every day on an audio-only link, where he was a regular chatterbox. He reminded me of some of the old phone phreaks I’d met in the SF hacker scene, extremely socially awkward people in person who were absolute charmers when they could present as a disembodied voice.

Literal charmers: the phreaks invented “social engineering”—calling up a Pac Bell office pretending to be a linesman thirty-five feet up a telephone pole who needed them to look up a key piece of information they’d left down in the truck. I’d gone to the social engineering competitions at Defcon, where live smooth talkers called up different giant companies and tried to get pieces of information out of them (the companies’ chief security officers volunteered them for this duty and no one got in trouble).

I’d assumed at the time that these bullshit artists were consummate actors, or maybe sociopaths, or maybe both. But talking to Dee v2v, I realized that he was someone whose neurological peccadillos made it hard to express himself f2f, but that just meant that he had all this time trapped in his own silence, time where he could form insights and observe fine details, and when he could open his mouth, what came out was pure flow.

“Have you talked to your teachers about this? I mean, that’s an excellent point.”

A long silence let me know I’d asked something stupid. Of course he hadn’t talked to his teachers about it. He could barely talk to them about anything.

“I mean, as long as any wrong answer I give on a test is either my fault or their fault, the teachers are going to go with my fault, right?”

The kid had a point.

Integretron missed the point. They claimed that, in addition to detecting “obvious cheating behavior” (like looking away from your screen), they could also find subtler “warning signs” like “microexpressions.” I dove down a research rabbit hole about these for a couple hours before concluding that this was about as reliable and scientific as astrology, and that Integretron was using it to alert teachers about suspected “academic dishonesty.”

They also had an “acoustic event classifier” that would tag any incidental sounds with guesses about their origins, along with a percentage-based confidence rating, to three decimal places. In the promotional video I watched, a duncey-looking oaf of a kid made a confused scowl at his screen, then subtly slid a cheat sheet out from under his keyboard; the paper rustled softly and an inset box showed the alert this generated: “paper rustling 97.659% confidence.” I wondered immediately what the difference was between that sound and a 97.658% one.

Finally, there was the “introspection layer”: Integretron sank deep hooks into the operating systems of its victims’ computers, which allowed it to monitor all the network traffic arriving at or leaving each computer, as well as every pixel that was painted on the screen. Every ten minutes, this data was packaged up as a summary report and fired off to Integretron for analysis, which would sound an alarm if it detected anything suspicious, like network connections to chat sites, or dialog boxes onscreen that might contain hints from a co-conspirator.

Great. So if you have a chat program running in the background that’s automatically fetching your DMs and mentions, those invisible, silent network processes would finger you as a cheater. If your antivirus or operating system pops up a nag dialog reminding you to update your software, bam, you’re also a cheater. I know a lot about computers, and I’ve spent many hours turning off everything that can pop up, pop under, or sound an alert unless it comes from a very short list of people I want to hear from no matter what, and my computer still interrupts me several times per week to tell me something I 100 percent don’t care about. For programmers and product designers, the temptation to throw a message bubble up on top of everything else on the screen is just irresistible.

Dee didn’t want to cheat on his tests, but he did want to do totally normal, non-cheating things: mutter to himself as he worked out answers, look out the window while he pondered them, get up and stretch when he was stuck. The stuff I’d done in every test I’d ever taken. Integretron’s answer? “Computer says no.”

I figured that beating Integretron wouldn’t be that hard. After all, computers are flexible, even if the education system isn’t. The only computer we know how to make is the one that can run all the programs we can write. Computer scientists call it the “Turing-Complete Von Neumann Machine,” but you probably call it “a computer.” The same underlying, theoretical capabilities are in your thermostat, a kid’s smart toy, a surveillance doorbell, a phone, a laptop, a supercomputer and a beige PC rotting in a landfill. Some of these are faster and some are slower, some have more storage or RAM than others, but fundamentally, they all work the same way.

That means that when Integretron asks your computer what network traffic it’s sending and receiving, what images are entering its camera and what sounds are hitting its mic, it’s relying on a program to tell the truth, and you can just write a different program that lies. Maybe it’ll be hard to make a convincing lie (for example, if you show the system a picture of a kitten instead of a test-taker, someone will notice that something’s wrong, even if the computer convincingly swears that’s what it’s seeing), but telling the lie? That’s easy.

I set about trying to figure out how to lie to Integretron. The obvious place to start was with a “virtual machine,” or VM. That’s a computer program that simulates a whole computer (because a computer is a gadget that can run any programs…including a program that pretends to be a gadget that can run any program!). You run a VM, and then you run an operating system inside of it, and then you run programs inside of that.

But when you run a VM, you get to play God. Like, when your computer runs a program that says, “Give me a copy of all the network traffic going through the wifi card,” that’s what happens. But with the VM, there’s another layer of indirection. The program asks the VM’s operating system for the wifi data. The operating system asks the VM for the wifi data. Then the VM asks the real operating system for the wifi data.

But you control the VM and the data it gets from the operating system. You can filter out some of that data, or all of it. You can substitute different data for it. You can tell the VM that your computer doesn’t have a wifi card and the VM will believe you, and repeat the lie to the OS running inside of the VM. Running a VM is like creating your own little blue-pilled Matrix universe where you can control reality and all the programs running inside it can’t tell the bare-metal truth of your actual computer from the fiction you feed it.
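
Here is the layer of indirection in miniature, as a sketch in plain Python. The names are mine and nothing here resembles a real hypervisor’s API; it only shows the shape of the trick, the lie by omission:

    # A toy model of the VM-in-the-middle: the guest asks "the VM" for wifi traffic,
    # and the VM decides which slice of the host's reality to pass along, or whether
    # to substitute a more convenient fiction. All names here are made up.

    def host_wifi_traffic():
        # Stand-in for what the real operating system would report.
        return ["dns lookup example.com", "chat app fetching DMs", "os update nag"]

    def guest_wifi_traffic(allow):
        """What the guest OS (and anything running inside it) gets to see."""
        return [packet for packet in host_wifi_traffic() if allow(packet)]

    # Only show the traffic the test itself would plausibly generate.
    print(guest_wifi_traffic(allow=lambda p: p.startswith("dns")))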

I fired up a VM and put a copy of Integretron into the Matrix. Using diagnostic tools, I could see it checking whether it was being tricked: as soon as it fired up, it tried to reach a bunch of nonsense URLs, like POHBSUOCEAGZXZPKAKLKBHQEWHYVMXRCEZVVWVDMEYCKCZIFGVOBTYLBWSTTRKCKLOCEIPCOGAUEEVAYYQQQTNFFAEQFDGH.com. I recognized the trick right away: the Integretron software was trying to figure out if it was running inside a VM.

You see, a lot of forensic VMs will answer all network requests in an effort to figure out how a piece of malicious software is communicating with its author or its “command and control” server. By answering any “are you there?” requests with a “here I am,” these tools try to psych out the malware, trapping it so it spills its guts. They’re hoping that it will answer with “Great! I’m so glad I found you. Here’s all the secret stuff I’ve gathered on the computer I’ve hijacked and lots of clues about what I might do next.”

This trap has a counter-trap. If you’re a malware creep and you’re worried that your weapon will be caught and dissected under a VM’s microscope, you can have it run its own psych-out, say, by trying to contact websites that don’t exist, like POHBSUOCEAGZXZPKAKLKBHQEWHYVMXRCEZVVWVDMEYCKCZIFGVOBTYLBWSTTRKCKLOCEIPCOGAUEEVAYYQQQTNFFAEQFDGH.com. If those nonexistent websites reply, then you know you’ve been blue-pilled, stuck in a VM’s Matrix, and you can shut down, erase yourself, and deny your enemy any intel about your operations.

But this can backfire! A(nother) hacker named Marcus, Marcus Hutchins, killed a piece of malware called WannaCry, ending a global ransomware emergency.

You see, WannaCry was North Korean state ransomware, and it tried to evade analysis by checking to see whether it could contact a site called iuqerfsodp9ifjaposdfjhgosurijfaewrwergwea.com each time it ran. Hutchins—who is definitely the superior hacker-named-Marcus, no question—paid $11 to register that gnarly domain and stuck a website on the other end of it—not even a real website, just enough of a server presence that it would pick up and say “Hello” when the worm reached out and said “ring, ring?” Within a few hours, every instance of this North Korean state malware had gone dormant. The world was saved.

(Plot twist: Hutchins had done some dumb hacking stuff as a kid that got him arrested after he saved the world. Don’t worry, he had a great lawyer and he’s a free man again.)

Let’s sum up:

  • Security researchers try to trap malware by sticking it into a VM where they can safely observe it.
  • Malware creeps try to trap security researchers by checking for an answer from impossible web servers as a way to detect whether the software’s running inside a VM.  
  • But if someone creates one of those impossible web servers for a mere $11, every instance of the malware in the whole world assumes it’s in a VM and shuts down.

Can you guess what Integretron did to stop someone from shutting down every copy of the software in the world?

Right. They don’t just check for one impossible web address: they generate random ones and try to contact those.

But—you may be asking—how can they be sure that a random web address is truly impossible? Like, what if there really was a POHBSUOCEAGZXZPKAKLKBHQEWHYVMXRCEZVVWVDMEYCKCZIFGVOBTYLBWSTTRKCKLOCEIPCOGAUEEVAYYQQQTNFFAEQFDGH.com?

Well, there can’t be. That domain has ninety-five characters before the .com, and that part of a domain can only be sixty-three characters long. So that is a truly impossible domain, and if you try to contact it and get a yes, well, saddle up, Neo, because you’re in the Matrix.

This is halfway clever, but only halfway. Here’s the thing: if all the random domains are impossible because they’re more than sixty-three characters long, then all it takes to trick Integretron is to tell the VM to ignore internet requests for sites that match that pattern.
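
The check itself is tiny. Here’s roughly what my filter boiled down to, as a sketch in plain Python; the function names are mine, and no real forensic VM is configured with code like this, but the logic is the same:

    # Sketch of the "impossible domain" filter: the labels of a domain (the parts
    # between the dots) top out at 63 characters, so any lookup with a longer label
    # can't be a real site, and a VM that wants to stay hidden should let it fail.

    MAX_LABEL_LEN = 63

    def is_impossible_domain(hostname):
        return any(len(label) > MAX_LABEL_LEN for label in hostname.split("."))

    def should_answer(hostname):
        """Only answer lookups that could exist on the real internet."""
        return not is_impossible_domain(hostname)

    probe = "POHBSUOCEAGZXZPKAKLKBHQEWHYVMXRCEZVVWVDMEYCKCZIFGVOBTYLBWSTTRKCKLOCEIPCOGAUEEVAYYQQQTNFFAEQFDGH.com"
    print(should_answer(probe))          # False: let it fail, like the real internet would
    print(should_answer("example.com"))  # True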

This is pretty obvious in hindsight, though I’ll admit that it took me two days of beating my brains out to come up with it, and then about ten minutes to configure my forensic VM to do it.

I told you I’m the dumber Marcus.

But once I had Integretron running in its VM, I had God-mode on it. It took me two days to work out how to do a mute button for the mic—I wrote a script that gathered “room tone,” the near-silence of an empty room, and then looped it into the mic’s input. Then, just because I was in a “yak-shaving” mood (that means that I was screwing around doing stuff that didn’t need to be done), I added a routine that injected synthetic keyboard noises into the stream whenever the user typed, so that a suspicious proctor wouldn’t be tipped off by silent typing.
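
If you want a feel for that, here’s the room-tone trick reduced to a toy sketch in plain Python. The sample buffers, the is_typing() callable, and everything around them are stand-ins, not the real app’s audio plumbing:

    # A toy version of the fake mic feed: loop a short recording of an empty room
    # forever, and mix in canned key-click samples while the user is typing.

    import itertools

    def fake_mic_feed(room_tone, key_click, is_typing):
        """Yield one audio sample at a time for the fake microphone."""
        click = iter(())  # the key click currently playing back (empty to start)
        for sample in itertools.cycle(room_tone):
            extra = next(click, None)
            if extra is None:                # no click in progress
                if is_typing():
                    click = iter(key_click)  # start a fresh click
                    extra = next(click, 0.0)
                else:
                    extra = 0.0
            yield sample + extra

    # e.g. a second of "silence" at 48kHz plus a tiny click, with nobody typing:
    feed = fake_mic_feed([0.0] * 48000, [0.2, -0.1, 0.05], is_typing=lambda: False)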

Next was figuring out how to loop the video. This is an old trope: take a video of an empty corridor and loop it into the feed of the security camera. That’s great for empty corridors, but not so much for humans, who are generally not perfectly still.

But I figured out how to defeat Integretron with its own idiocy. In Integretron’s bizarre universe, a test-taker who wasn’t typing an answer was supposed to sit perfectly still, eyes locked on the screen, barely breathing or blinking. This is a task that computers are much better at than any human could possibly be.

Ever heard of the Turing Test? That’s a thought experiment that the queer computer science and cryptography hero Alan Turing proposed to evaluate claims of artificial intelligence. To pass the test, the AI has to converse with a human being without that human guessing that they’re conversing with a computer.

Today, we use baby Turing Tests all the time to try to block bots. Ever have to identify all the stoplights in a photo grid or retype some distorted numbers and letters into a box on a web form? That’s called a CAPTCHA (“Completely Automated Public Turing Test to Tell Computers and Humans Apart”), and it’s supposed to lock out bots and let in people.

Integretron was kind of a reverse Turing Test: you passed it by being as robotic as possible, modifying your behavior and even your eye movements so that a computer could understand them. Because Integretron really wanted you to behave like a computer program, it was actually pretty easy to trick Integretron with a computer program.

I wrote a little loop-making app. You fired it up before your test and followed some prompts to create a video loop in which you were being a good little Integretron robot, eyes moving back and forth across the screen but never looking away from it, hands perfectly still, body perfectly still, hardly breathing, the room over your shoulder in a state of frozen stasis.

Once this video was saved, you could trigger it with a keyboard combination that the VM would intercept—but not pass on to Integretron—which would then swap in the video loop. You could get up and have a piss, get a snack, throw up from stress, remind your kid brother to keep it down because you’re taking a test, and the software—and any human watchers—would be none the wiser.
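
The video side was the same idea with frames instead of audio samples. Another toy sketch, with the frame sources and the cloak_on() toggle standing in for the real virtual-webcam plumbing:

    # For each frame the fake webcam hands to Integretron, decide whether to pass
    # the live frame through or substitute the next frame of the pre-recorded
    # "obedient test-taker" loop.

    import itertools

    def cloaked_video(live_frames, recorded_loop, cloak_on):
        """Yield frames for the fake webcam device."""
        loop = itertools.cycle(recorded_loop)
        for frame in live_frames:
            yield next(loop) if cloak_on() else frame

    # e.g. with strings standing in for frames:
    frames = cloaked_video(iter(["live1", "live2"]), ["loop1"], cloak_on=lambda: True)
    print(list(frames))  # ['loop1', 'loop1']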

I almost screwed myself: remember that cool feature where I mixed keyboard noise into the room tone if I detected typing? I forgot to turn it off during the video loop, which meant that if you did some typing while you were wearing a cloak of invisibility (which is what I called my app), the app would generate typing sounds and insert them into the audio stream.

As the great philosopher David St. Hubbins reminds us, “There’s such a fine line between clever and stupid.”

With that final bug squashed, I had a fully operational battle station to turn over to Dee for a test drive.

Dee’s family’s apartment was hella small, but neat and well-kept. He shared his bedroom with Alonso, a younger brother whose evident passions for Pokémon and the Golden State Warriors were manifested by a solid collage of posters, printouts and magazine ads that crawled up the wall and terminated in a neat equator that divided Alonsoland from Planet Dee.

Dee’s half of the room was no less busy, but the walls were covered with his trademark intricate line-doodles, mostly on three-ring binder paper, each squared against another’s corners and ascending to the ceiling line. His desk sported a cup of extra-fine felt-tip pens, a duct-tape-fixed charger brick, and a decade-old HP laptop, the keycaps worn to translucent plastic blanks with the ghosts of letters inscribed upon them. My own laptop—a DIY machine from Framework that I’d assembled myself—was pretty beat up, but Dee’s wasn’t just roadworn, it was old, with that brittle look of plastic that’s been exposed to too much oxidizing air and punishing UV, like you could crumble it in your palm.

“Okay,” I said. “I’ve got the teacher’s side of the app running on my laptop. What I want to do is hand it to you and then I’ll log in as a student from your computer, and you’ll be able to see how it works.”

Dee’s “okay” was as small as an ant. I looked long at him, trying to figure out what he was feeling. I was so aware that he had trouble advocating for himself, and I was haunted by the possibility that I’d ride roughshod over him and he wouldn’t say a word.

“Okay,” I said again, and got to work setting up my app on his old HP.

It took a lot longer than I thought it would. He was still running Windows 10, an OS I had spent a lot of time avoiding. The VM didn’t want to work on his system, and I spent an hour running down patches and workarounds. Then I had to install Python and my libraries on his machine—a brutally slow process, thanks to the terrible broadband in his apartment building. After forty-five minutes and six failed downloads, I switched to a tethered connection on my phone, which was a little better, though the cellular service in his neighborhood was nearly as bad as his broadband. How the hell did Dee do Zoom school at all in this broadband desert?

Finally, I had it all set up: Python, the VM, the patches to make the VM work. Dee had wandered into the kitchen for a while and come back with grilled cheese sandwiches for us, and when I asked for hot sauce, he actually smiled at me and then came back with a no-label bottle of intriguing, oily red liquid.

“My dad’s,” he said, with pride. “He says no one sells good hot sauce anymore, so he makes his own.”

It. Was. So. Good. Spicy, sure, but with so much flavor, a little cinnamon, some lemon rind, and even a hint of vanilla. I filed mental tasting notes to take home for Ange, and I even worked up the nerve to ask for the recipe.

“Huh-uh,” Dee said, his smile broadening. “Dad’s secret. He says he’ll tell me when I’m eighteen, if I promise to keep the secret. But I can give you a bottle. He makes a batch about once a month and he likes it when someone who appreciates it takes one away.”

“I have to introduce your dad to my wife someday. I bet they could nerd out about hot sauce for hours.”

I tucked the hot sauce bottle—an old glass ketchup bottle with a tight-screwed lid—into my backpack and went back to wrestling with the laptop. Just one more bug to squash and—

“Okay, let’s do this,” I said, and fired up the teacher-side Integretron app on my laptop and handed it to him. Then I let him watch as I created my room-tone audio loop and robotic obedient test-taker video loop and then logged in as a student.

At first, all Dee wanted to do was poke around on the teacher’s interface. Watching him explore the other side of the curtain was an education in itself, and as he grew engrossed, he lost his shyness and awkwardness, keeping up a running monolog about what he found. “There’s an eyeball score! Marcus, look at the camera and, like, look around, okay?” I did, and then he took me through a series of experiments about what kinds of eye movements were perceived as “suspicious” by the algorithm.

“Dee,” I said, after ten minutes of this. “I don’t want to discourage you, but aren’t you worried that knowing all the ways this thing spies on you will make you more self-conscious? I mean, didn’t you say that Integretron makes you freeze up because you’re being watched?”

He shook his head. “Look up and to the left again, and then back to the middle, and then again, okay?” I did. “No, I’m not worried about that. The thing is that I never knew how I was being watched, what triggered it.” He had me do some more eye-movements.

“You know Nineteen Eighty-Four?” he asked. I couldn’t stop myself from laughing and he came over shy again and I felt terrible.

“Sorry, sorry. Yeah. I used to go by Winston online, spelled W15T0N.”

“Weird,” he said. “Okay.” He poked around the teacher UI some more. “You know how Winston’s apartment had that spot, that ‘shallow alcove,’ where the telescreen couldn’t see him?”

“Yeah,” I said. “I remember that part. You like that book, huh?”

“We did it in English last year. It was okay. But I liked how Winston, he had that corner where he could sit, and he could just be himself. Think his thoughts. When I think, my eyes are all over the place”—which explained his fascination with the eye-tracker and made it take on a certain horror-movie aspect—“and sometimes I draw. When I’m doing that, my thoughts, they just kind of snap into place. If I have to think about them too hard, it’s like trying to force two Lego bricks together, instead of just going smooth and slow and finding the spot where they click.”

“That makes so much sense, Dee. Yeah, okay. The app I made for you, I hope it’ll give you a corner where the telescreen can’t see you.”

“That’s what I hope, too.” He really had an angel’s smile.

“Okay,” I said. “Let’s try on the cloak of invisibility.”

A virtual machine is a computer program that pretends to be a computer. That’s a lot of work. Luckily, computers are getting faster all the time, and most programs don’t come close to red-lining the RAM or processors on the computers they run on. If you’re running a program designed for last year’s computer on this year’s model, chances are it will have capacity to spare.

But Dee’s computer wasn’t last year’s model. It was eight years old, and no one had ever upgraded the RAM it shipped with. It had an old-fashioned, mechanical, slow hard drive, the kind with a spinning platter that was one thousandth the speed of a modern solid-state drive. He was a careful and thoughtful person and I couldn’t find any of the malware that might slow down an old computer if you clicked a bad email link.

But that computer was so old it didn’t need malware to be too slow to be usable.

My invisibility cloak didn’t work. It just didn’t. Between the VM and the audio loop and the video loop and the crappy on-board graphics chip, the machine just crawled. Looking over Dee’s shoulder at the teacher’s interface confirmed that it could tell something was wrong, though it didn’t know what. The error message read, “Test taker’s computer is too slow to run Integretron. Provide test-taker with minimum specifications for Integretron success,” next to a button labeled “Terminate test and send message.”

Shit.

Tina had just asked for second helpings of Ange’s jerk tofu when I remembered that I still had a bottle of Dee’s father’s hot sauce in my bag. I got it and presented it to Ange and we watched as she dabbed some on the side of her plate and tasted some off her finger and then off a nugget of jerk-slathered tofu.

“Holy crap that’s good,” Ange said, then turned to Tina. “You’ll love it.” Ange’s sister liked hot food, but she was a civilian bystander in the hot sauce wars, which Ange had been prosecuting against her taste buds and esophageal lining since she was a tween.

Tina tried it and agreed, and then I needed a second helping, too, which polished off the tofu.

“Where’d you get it?” Ange said, reaching for the bottle.

“Tina’s friend,” I said. “Dee. His dad makes it.”

“Dee,” Tina said. “What a mess. Were you able to help him?”

I shrugged. “Not yet. I think I have to get him a more recent laptop to run the software I cooked up. I’m not enough of a code ninja to make it run on his old machine. I’ve got a retired ThinkPad he can have.”

Tina cocked her head at me and looked so much like Ange that I nearly laughed. They were four years apart, and fiercely protective of one another. The first time I’d met Tina, she’d declared me barely acceptable and then told me that if I ever hurt Ange, she’d “track you down and pull your scrotum over your head.” Our relationship had improved since, but there was never any question as to which side Tina would take in any dispute.

“Hold up, why does Dee need a new computer?”

“Because he’s got this ancient HP that won’t run the software I wrote for him.”

Her eyebrows did that thing that makes me take shelter when Ange does it and I started to worry.

“You wrote him software?”

“Yeah,” I said, the sinking feeling growing. “I made him an invisibility cloak.”

“An invisibility cloak.”

Not a question, but it demanded an answer, so I explained how it worked, all the cool research I’d done, the awesome ways I’d come up with to beat the system.

“Oh, Marcus,” Tina said when I finished. She didn’t sound angry anymore, but she did sound exhausted. “It was really nice of you to do all of that for Dee, but what made you think that it would help him?”

My first reaction was to get all defensive: what did she mean what made me think that it would help him? He asked me for help. She told him to ask me for help! And if she thought I was wrong, why didn’t she say so? What’s this Socratic dialogue bullshit?

There was a time when I would have gone with that first reaction, but I’d learned to recognize the physical signs that someone had put me on tilt and screwed up my capacity to reason. My pulse was up, my hands felt a little numb, my face felt tight. My thoughts, ergo, were not to be trusted.

“Sorry,” I said, “give me a second.” I drank some water and took a couple deep breaths. Then a couple more. I tried to hold Tina’s question at arm’s length and figure out why it made me so upset.

“Hey, Tina,” I said. She and Ange were looking at me with expressions of concern so similar that it made me smile, and that made me calm down a little more. “Sorry,” I said again. “I guess maybe I missed something. What did you think I would do for Dee when he asked for help?”

“Well, he’d talked about writing something about that horrible test-taking app, like an open letter to the school and the board that he’d publish online or maybe present at one of their meetings. Didn’t he mention that?”

He hadn’t. Had I given him room to? Had I even considered that he might want something other than a cloak of invisibility? I thought back to all those times I’d watched him struggle to speak up. I would never have imagined that he wanted to get up in front of a bunch of adults and tell them how their decisions had affected him. I’d assumed that he needed a protector, not a sidekick who could help him research his talking points.

“Oh, shit,” I said.

“Yeah,” Tina said.

“It’s okay,” Ange said. “I mean, if nothing else, your cloak of invisibility will make a hell of a demo when Dee tells ’em that they’ve wasted their money on an automated cruelty machine.” She half rose out of her seat so she could reach over and give the back of my neck a friendly squeeze. That was one of the places that got very, very tense when I felt upset.

I met Dee outside the La Escuelita Elementary main entrance. The sun was just setting, and we both squinted at each other. I’d actually put on my funeral suit for the event. Dee wore jeans whose doodles extended all the way to the cuffs and a hoodie that was covered in pictures of license plates. I smiled and pointed at it.

“That’s one of those license-plate camera hoodies, right?” Any automated license-plate reader that took Dee’s picture would register him as about twenty slow-moving cars. Best of all, the license plates spelled out the text of the Fourth Amendment. 

He smiled back. What a smile that kid had! “I wore it in your honor.”

When I’m out of my depth, I blurt. I blurted, “Are you gonna be able to do this?”

The smile vanished. He got very serious and very thoughtful. “They have to know,” he said. “They have to know how this feels for people like me. If I don’t tell them, who will?”

He was shy at the mic, so shy, eyes downcast, shoulders slumped. But he leaned way down into the mic and spoke, and I watched as the AV tech assessed the situation and hit the slider for the audience mic and cranked it way up, so that Dee’s soft, halting delivery filled the room, like a secret being whispered out of every PA speaker.

“This is how the eye tracking works,” he said, and I hit the slide advance, bringing up the hand-drawn graphic he’d made based on the data I’d collated for him: a set of faces—agonized, thoughtful, engrossed—each with out-of-bounds eyes and the probable cheating score the system ascribed to it, written in red ballpoint, traced and retraced, and heavily underlined. The gutters between the faces were filled with Dee’s intricate traceries.

He left it there for a long while, saying nothing, the hum from his mic rising as we all took it in.

“This isn’t a system that detects cheating,” he said. “None of this is cheating.” He let that hang for a long time. “Next slide,” he said.

The final slide was a kicker: Oakland Unified’s budget for Integretron, set against its budgets for other line items. It was bigger than the budget for after-school STEM clubs, for example. “Which means we’re spending more on abusing kids with computers than we are on teaching them how to use computers.”

If it had been me, I’d have gone on and on and on this point, it was so outrageous. But Dee used words more carefully than me. He said his piece and he stepped away from the mic.

Then, just as the frozen members of the board on the stage were figuring out what to say to him, he stepped back up.

“Thank you for giving me a chance to tell you how it feels to be a student.”

Dee met me at the cafe again and handed me another bottle of his dad’s hot sauce, bigger this time. “Tina told me her sister really likes it,” he said, around his latte.

“We all like it,” I said.

“That shallow alcove where Winston Smith’s telescreen couldn’t see him, you remember it, right?”

“I do,” I said.

“It could see him the whole time. Remember that?”

I opened and closed my mouth a few times. He smiled. That smile! “Everyone forgets that. Even my English teacher, when we were talking about it in class.”

A man of few words, Dee. He waited for me to figure it out.

“Winston thought he could make his life better by hiding from the telescreen,” I said.

“Right,” Dee said. “Right. But what he really needed to do was abolish the telescreen.”

Ange loved the hot sauce. Tina told me I’d done right in the end.

And Dee? He’s graduating this spring, and he’s spending his final semester working with a student group to figure out a plan for spending the money they’re saving by not paying Integretron. I want him to talk to a lawyer about the way that the CEO of the company doxed him on Reddit, but he says he’s got better things to do.


About the Author

Cory Doctorow (craphound.com) is a science fiction author, activist and journalist. He is the author of many books, most recently THE LOST CAUSE, a solarpunk science fiction novel of hope amidst the climate emergency. His most recent nonfiction book is THE INTERNET CON: HOW TO SEIZE THE MEANS OF COMPUTATION, a Big Tech disassembly manual. Other recent books include RED TEAM BLUES, a science fiction crime thriller; CHOKEPOINT CAPITALISM, nonfiction about monopoly and creative labor markets; the LITTLE BROTHER series for young adults; IN REAL LIFE, a graphic novel; and the picture book POESY THE MONSTER SLAYER. In 2020, he was inducted into the Canadian Science Fiction and Fantasy Hall of Fame.